DM-48701: Fix incorrect footprint merging in diffim #379
Conversation
fred3m left a comment:
Looks good except that I made a request to make sure that the test looks for all duplicate peaks, not just the one from the mergeFootprintSet bug.
# The original bug manifested as the (175,124) source disappearing in
# the merge step, and being included in the peaks of a duplicated
# (75,124) source, even though their footprints are not connected.
mask = np.isclose(result.diaSources["slot_Centroid_x"], 75) & \
    np.isclose(result.diaSources["slot_Centroid_y"], 124)
self.assertEqual(mask.sum(), 1)
peaks = result.diaSources[mask][0].getFootprint().peaks
self.assertEqual(len(peaks), 1)
self.assertEqual(peaks[0].getIx(), 75)
self.assertEqual(peaks[0].getIy(), 124)

mask = np.isclose(result.diaSources["slot_Centroid_x"], 175) & \
    np.isclose(result.diaSources["slot_Centroid_y"], 124)
self.assertEqual(mask.sum(), 1)
peaks = result.diaSources[mask][0].getFootprint().peaks
self.assertEqual(len(peaks), 1)
self.assertEqual(peaks[0].getIx(), 175)
self.assertEqual(peaks[0].getIy(), 124)
I'm not sure that this is the most useful test anymore. While it did catch the one specific bug in mergeFootprintSets, it doesn't check for the general problem of having duplicate peaks. I would suggest something like:
peak_x = np.array([peak['f_x'] for src in catalog[catalog["parent"] == 0] for peak in src.getFootprint().peaks])
peak_y = np.array([peak['f_y'] for src in catalog[catalog["parent"] == 0] for peak in src.getFootprint().peaks])
for i in range(len(peak_x) - 1):
    for j in range(i + 1, len(peak_x)):
        self.assertFalse((peak_x[i] == peak_x[j]) & (peak_y[i] == peak_y[j]))
Note: this code is untested and there is probably a way to vectorize it using np.unique. Something like (also untested)
coords = np.column_stack((peak_x, peak_y))
unique_coords, counts = np.unique(coords, axis=0, return_counts=True)
self.assertEqual(np.sum(counts > 1), 0)
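For reference, a minimal self-contained illustration of the np.unique approach, using plain numpy and toy peak coordinates invented for this example (the duplicated (75, 124) peak mimics the failure mode described above):

import numpy as np

# Toy peak coordinates: (75, 124) appears twice, standing in for a
# duplicated peak of the kind the original bug produced.
peak_x = np.array([75, 175, 75])
peak_y = np.array([124, 124, 124])

# Stack into (N, 2) rows and count how often each (x, y) pair occurs.
coords = np.column_stack((peak_x, peak_y))
unique_coords, counts = np.unique(coords, axis=0, return_counts=True)

print(unique_coords[counts > 1])   # [[ 75 124]] -> the duplicated peak
print(np.sum(counts > 1))          # 1, so assertEqual(..., 0) would fail here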
Thanks Fred, that was a nearly correct bit of code you produced there! I've added that and checked that it does fail (with one count==2) for the original bug condition, but left the original tests with a comment about why.
ebellm left a comment:
one small bug to fix
Force-pushed from d614789 to c17c50e.
This is a change I somehow didn't make on DM-42696, but should have, to demonstrate that we really are deblending in the "easy" case. That ticket made it so that we could test against a proper truth catalog, but never implemented said test.
This avoids the (unresolved) bug in merge(), and leaves the positive footprints unmerged in the output catalog. It does require an API change to processResults, since we now have to use SourceCatalogs instead of FootprintSets, but I think that's an overall improvement (though it does make the `if doDeblend` branches rather messier).
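For illustration only (this is not the actual processResults change in the PR), here is a minimal sketch of the underlying idea of carrying detections as a SourceCatalog rather than merging FootprintSets. It assumes the standard afw primitives (ImageF, Threshold, FootprintSet, SourceTable, FootprintSet.makeSources); the image contents and threshold are invented for the example.

import lsst.afw.detection as afwDetect
import lsst.afw.image as afwImage
import lsst.afw.table as afwTable

# Two well-separated positive "sources" on a small blank image.
image = afwImage.ImageF(64, 64)
image.array[:] = 0.0
image.array[20:24, 10:14] = 100.0
image.array[40:44, 50:54] = 100.0

# Detect positive footprints above a fixed threshold.
fpSet = afwDetect.FootprintSet(image, afwDetect.Threshold(10.0))

# Convert the footprints directly into a SourceCatalog (one record per
# footprint), rather than merging positive and negative FootprintSets.
schema = afwTable.SourceTable.makeMinimalSchema()
sources = afwTable.SourceCatalog(schema)
fpSet.makeSources(sources)

for src in sources:
    footprint = src.getFootprint()
    print(footprint.getBBox(), len(footprint.peaks))

Each record keeps its own footprint and peaks, so downstream code can check or deblend them per source without any merge step.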
Force-pushed from c17c50e to 2cf1bc2.